Hidden Markov model

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (''hidden'') states. An HMM can be regarded as the simplest dynamic Bayesian network. The mathematics behind the HMM was developed by L. E. Baum and coworkers. It is closely related to earlier work on the optimal nonlinear filtering problem by Ruslan L. Stratonovich, who was the first to describe the forward-backward procedure.
In simpler Markov models (like a Markov chain), the state is directly visible to the observer, and therefore the state transition probabilities are the only parameters. In a ''hidden'' Markov model, the state is not directly visible, but the output, which depends on the state, is visible. Each state has a probability distribution over the possible output tokens. Therefore, the sequence of tokens generated by an HMM gives some information about the sequence of states. The adjective ''hidden'' refers to the state sequence through which the model passes, not to the parameters of the model; the model is still referred to as a ''hidden'' Markov model even if these parameters are known exactly.
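This generative view can be sketched in a few lines of code. The two-state, three-token model below is a hypothetical example: the transition matrix `A`, emission matrix `B`, and initial distribution `pi` are illustrative values, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0.7, 0.3],       # transition probabilities P(state_t | state_{t-1})
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],  # emission probabilities P(token | state)
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])       # initial state distribution

def sample(T):
    """Generate a hidden state sequence and its visible output tokens."""
    states, tokens = [], []
    s = rng.choice(2, p=pi)
    for _ in range(T):
        tokens.append(rng.choice(3, p=B[s]))  # output depends on the current state
        states.append(s)
        s = rng.choice(2, p=A[s])             # next state depends only on the current one
    return states, tokens

states, tokens = sample(10)
# An observer sees only `tokens`; `states` is the hidden Markov chain.
```

Only the token sequence is observable, yet because each state has its own emission distribution, the tokens carry partial information about the hidden states.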
Hidden Markov models are especially known for their applications in temporal pattern recognition such as speech, handwriting, and gesture recognition,〔Thad Starner, Alex Pentland. Real-Time American Sign Language Visual Recognition From Video Using Hidden Markov Models. Master's Thesis, MIT, Feb 1995, Program in Media Arts〕 part-of-speech tagging, musical score following,〔B. Pardo and W. Birmingham. Modeling Form for On-line Following of Musical Performances. AAAI-05 Proc., July 2005.〕 partial discharges,〔Satish L, Gururaj BI (April 2003). "Use of hidden Markov models for partial discharge pattern classification". ''IEEE Transactions on Dielectrics and Electrical Insulation''.〕 and bioinformatics.
A hidden Markov model can be considered a generalization of a mixture model in which the hidden variables (or latent variables), which control the mixture component selected for each observation, are related through a Markov process rather than being independent of each other. Recently, hidden Markov models have been generalized to pairwise Markov models and triplet Markov models, which allow the consideration of more complex data structures〔W. Pieczynski, Multisensor triplet Markov chains and theory of evidence, International Journal of Approximate Reasoning, Vol. 45, No. 1, pp. 1-16, 2007.〕〔M. Y. Boudaren, E. Monfrini, W. Pieczynski, and A. Aissani, Dempster-Shafer fusion of multisensor signals in nonstationary Markovian context, EURASIP Journal on Advances in Signal Processing, No. 134, 2012.〕 and the modelling of nonstationary data.〔P. Lanchantin and W. Pieczynski, Unsupervised restoration of hidden non stationary Markov chain using evidential priors, IEEE Trans. on Signal Processing, Vol. 53, No. 8, pp. 3091-3098, 2005.〕〔M. Y. Boudaren, E. Monfrini, and W. Pieczynski, Unsupervised segmentation of random discrete data hidden with switching noise distributions, IEEE Signal Processing Letters, Vol. 19, No. 10, pp. 619-622, October 2012.〕
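The relationship to mixture models is easy to check numerically: if every row of the transition matrix is the same distribution, each hidden state is drawn independently of the previous one, and the HMM likelihood factorizes into an ordinary mixture likelihood. The matrices below are illustrative values chosen for this sketch.

```python
import numpy as np

w = np.array([0.6, 0.4])        # mixture weights, used as every transition row
A = np.vstack([w, w])           # transition matrix with identical rows
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]]) # emission probabilities (illustrative)

obs = [0, 2, 1]                 # an observed token sequence

# HMM likelihood of `obs` via the forward recursion.
alpha = w * B[:, obs[0]]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
hmm_lik = alpha.sum()

# Mixture likelihood: each observation drawn independently from the mixture.
mix_lik = np.prod([w @ B[:, o] for o in obs])

assert np.isclose(hmm_lik, mix_lik)
```

With identical transition rows, `(alpha @ A)` rescales `alpha` to a multiple of `w`, so the forward recursion reproduces the product of per-observation mixture likelihoods; a genuine HMM differs precisely in that its transition rows depend on the previous state.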
== Description in terms of urns ==

In its discrete form, a hidden Markov process can be visualized as a generalization of the urn problem with replacement (where each item from the urn is returned to the original urn before the next step). Consider this example: in a room that is not visible to an observer there is a genie. The room contains urns X1, X2, X3, … each of which contains a known mix of balls, each ball labeled y1, y2, y3, … . The genie chooses an urn in that room and randomly draws a ball from that urn. It then puts the ball onto a conveyor belt, where the observer can observe the sequence of the balls but not the sequence of urns from which they were drawn. The genie has some procedure for choosing urns: the choice of the urn for the ''n''-th ball depends only upon a random number and the choice of the urn for the (''n'' − 1)-th ball. Because the choice of urn does not directly depend on the urns chosen before this single previous urn, this is a Markov process. It can be described by the upper part of Figure 1.
The Markov process itself cannot be observed; only the sequence of labeled balls can be, so this arrangement is called a "hidden Markov process". This is illustrated by the lower part of the diagram shown in Figure 1, where one can see that balls y1, y2, y3, y4 can be drawn at each state. Even if the observer knows the composition of the urns and has just observed a sequence of three balls, ''e.g.'' y1, y2 and y3, on the conveyor belt, the observer still cannot be ''sure'' which urn (''i.e.'', at which state) the genie drew the third ball from. However, the observer can work out other information, such as the likelihood that the third ball came from each of the urns.
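That last inference, the probability of each urn given the observed balls, is what the forward recursion computes; since the third ball is the most recent observation, normalizing the final forward variable already gives the answer. The urn compositions and switching probabilities below are hypothetical values, since the text only stipulates that the observer knows them.

```python
import numpy as np

# Three urns, three ball labels; all numbers are illustrative assumptions.
A = np.array([[0.8, 0.2, 0.0],   # genie's urn-switching probabilities
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],   # P(ball label | urn), known to the observer
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
pi = np.array([1/3, 1/3, 1/3])   # initial urn chosen uniformly (assumption)

obs = [0, 1, 2]                  # observed ball labels y1, y2, y3

alpha = pi * B[:, obs[0]]        # forward recursion over the observations
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]

posterior = alpha / alpha.sum()  # P(urn for third ball | y1, y2, y3)
# The observer cannot identify the urn with certainty, only this distribution.
```

For balls earlier in the sequence, the backward pass would also be needed; this is the forward-backward procedure mentioned above.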

Excerpt source: Wikipedia, the free encyclopedia. The full text of "Hidden Markov model" is available on Wikipedia.